In quantum information theory, '''quantum relative entropy''' is a measure of distinguishability between two quantum states. It is the quantum mechanical analog of relative entropy.

== Motivation ==

For simplicity, it will be assumed that all objects in this article are finite-dimensional.

We first discuss the classical case. Suppose the probabilities of a finite sequence of events are given by the probability distribution ''P'' = {''p''<sub>1</sub>, ..., ''p''<sub>''n''</sub>}, but somehow we mistakenly assumed it to be ''Q'' = {''q''<sub>1</sub>, ..., ''q''<sub>''n''</sub>}. For instance, we can mistake an unfair coin for a fair one. According to this erroneous assumption, our uncertainty about the ''j''-th event, or equivalently, the amount of information provided after observing the ''j''-th event, is

:<math>-\log q_j .</math>

The (assumed) average uncertainty of all possible events is then

:<math>-\sum_j p_j \log q_j .</math>

On the other hand, the Shannon entropy of the probability distribution ''p'', defined by

:<math>-\sum_j p_j \log p_j ,</math>

is the real amount of uncertainty before observation. Therefore the difference between these two quantities

:<math>-\sum_j p_j \log q_j - \left(-\sum_j p_j \log p_j\right) = \sum_j p_j \log p_j - \sum_j p_j \log q_j</math>

is a measure of the distinguishability of the two probability distributions ''p'' and ''q''. This is precisely the classical relative entropy, or Kullback–Leibler divergence:

:<math>D_{\mathrm{KL}}(P \| Q) = \sum_j p_j \log \frac{p_j}{q_j} .</math>

Note
# In the definitions above, the convention that 0·log 0 = 0 is assumed, since <math>\lim_{x \to 0} x \log x = 0</math>. Intuitively, one would expect an event of zero probability to contribute nothing towards entropy.
# The relative entropy is not a metric. For example, it is not symmetric: the uncertainty discrepancy in mistaking a fair coin for an unfair one is not the same as that of the opposite situation. A worked example follows below.
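As a worked illustration of this asymmetry (a sketch with hypothetical numbers; the particular biases are chosen only for illustration), take an unfair coin ''P'' = {3/4, 1/4} and a fair coin ''Q'' = {1/2, 1/2}, with logarithms taken base 2. Mistaking the unfair coin for a fair one gives

:<math>D_{\mathrm{KL}}(P \| Q) = \tfrac{3}{4} \log_2 \frac{3/4}{1/2} + \tfrac{1}{4} \log_2 \frac{1/4}{1/2} \approx 0.189 \text{ bits},</math>

while the opposite mistake gives

:<math>D_{\mathrm{KL}}(Q \| P) = \tfrac{1}{2} \log_2 \frac{1/2}{3/4} + \tfrac{1}{2} \log_2 \frac{1/2}{1/4} \approx 0.208 \text{ bits},</math>

so <math>D_{\mathrm{KL}}(P \| Q) \neq D_{\mathrm{KL}}(Q \| P)</math>.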